A constructive algorithm to solve "convex recursive deletion" (CoRD) classification problems via two-layer perceptron networks

Authors

  • Carlos Cabrelli
  • Ursula Molter
  • Ron Shonkwiler
Abstract

A sufficient condition for a region to be classifiable by a two-layer feedforward neural net (a two-layer perceptron) using threshold activation functions is that it be a convex polytope, or a convex polytope intersected with the complement of a convex polytope in its interior, or that set intersected with the complement of a further convex polytope in its interior, and so on recursively. Such regions have been called convex recursive deletion (CoRD) regions. We give a simple algorithm for finding the weights and thresholds in both layers of a feedforward net that implements such a region. The results of this work help in understanding the relationship between the decision region of a perceptron and the corresponding geometry in input space. Our construction extends in a simple way to the case in which the decision region is a disjoint union of CoRD regions (requiring three layers). This work therefore also helps in understanding how many neurons are needed in the second layer of a general three-layer network. When the decision region of a network is known and is a union of CoRD regions, our results allow the weights and thresholds of the implementing network to be calculated directly and rapidly, without the need for thousands of backpropagation iterations.
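To make the idea concrete, here is a minimal sketch of a two-layer threshold network implementing the simplest nontrivial CoRD region: a square frame, i.e. an outer polytope with an inner polytope deleted. The polytopes, weights, and thresholds below are illustrative choices, not the paper's exact formulas: each hidden unit fires when its half-space holds, and the output unit is wired so it fires exactly when all outer faces hold and at least one inner face fails.

```python
import numpy as np

# Hypothetical example (not the paper's construction): the CoRD region
#   R = P1 \ int(P2),  with P1 = [0,4]^2 and P2 = [1,3]^2.
# Hidden layer: one threshold unit per polytope face, firing when the
# input satisfies that face's half-space a.x >= b.
# Output layer: weight +4 on each outer-face unit, -1 on each inner-face
# unit, threshold 13, so it fires iff all 4 outer faces hold (s1 = 4)
# and at least one inner face fails (s2 <= 3). Boundary points of P2
# are not treated carefully here; this is a sketch, not a full method.

# Half-spaces a.x >= b, one row per face: (a1, a2, b)
outer = np.array([[1, 0, 0], [-1, 0, -4], [0, 1, 0], [0, -1, -4]])
inner = np.array([[1, 0, 1], [-1, 0, -3], [0, 1, 1], [0, -1, -3]])

def hidden(x, faces):
    """Threshold activations for one polytope's faces (0/1 vector)."""
    return (faces[:, :2] @ x >= faces[:, 2]).astype(int)

def cord_net(x):
    """Output unit: 1 iff x lies in P1 but not inside P2."""
    s1 = hidden(x, outer).sum()   # outer faces satisfied
    s2 = hidden(x, inner).sum()   # inner faces satisfied
    return int(4 * s1 - s2 >= 13)

print(cord_net(np.array([0.5, 0.5])))  # in the frame -> 1
print(cord_net(np.array([2.0, 2.0])))  # in the deleted square -> 0
print(cord_net(np.array([5.0, 5.0])))  # outside the outer square -> 0
```

The asymmetric output weights (+4 vs. -1) ensure that a point violating an outer face can never be pushed over the threshold by also violating inner faces; deeper nesting would require correspondingly larger weight ratios, echoing the weight-growth issue noted in the related work below.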


Related articles

Implementation Feasibility of Convex Recursive Deletion Regions Using Multi-Layer Perceptrons

A constructive algorithm to implement convex recursive deletion regions via two-layer perceptrons was presented in a recent study. In that algorithm, the absolute values of the weights grow as the number of nested layers of a convex recursive deletion region increases. In addition, the absolute values of the weights are determined according to the complexity of the str...


New Constructive Neural Network Architecture for Pattern Classification

Problem statement: Constructive neural network learning algorithms provide optimal ways to determine the architecture of a multilayer perceptron network, along with learning algorithms for determining appropriate weights for pattern classification problems. These algorithms initially start with a small network and dynamically allow the network to grow by adding and training neurons as needed unti...


Analysis and test of efficient methods for building recursive deterministic perceptron neural networks

The Recursive Deterministic Perceptron (RDP) feed-forward multilayer neural network is a generalisation of the single layer perceptron topology. This model is capable of solving any two-class classification problem as opposed to the single layer perceptron which can only solve classification problems dealing with linearly separable sets. For all classification problems, the construction of an R...


Output Reachable Set Estimation and Verification for Multi-Layer Neural Networks

In this paper, the output reachable set estimation and safety verification problems for multi-layer perceptron neural networks are addressed. First, a concept called maximum sensitivity is introduced and, for a class of multi-layer perceptrons whose activation functions are monotonic functions, the maximum sensitivity can be computed by solving convex optimization problems. Then, using a simula...


Architecture Optimization Model for the Multilayer Perceptron and Clustering

This paper presents an approach called the Architecture Optimization Model for the multilayer perceptron, which optimizes the architecture of the multilayer perceptron network. The results obtained by neural networks depend on their parameters, and the architecture has a great impact on convergence. More precisely, the choice of neurons in each hidden l...



Journal:
  • IEEE transactions on neural networks

Volume 11, Issue 3

Pages: -

Publication year: 2000